22. Solution: Pooling Intuition
Solution
The correct answer is: decrease the size of the output and prevent overfitting. Preventing overfitting is a consequence of reducing the output size, which in turn reduces the number of parameters in later layers.
Recently, pooling layers have fallen out of favor. Some reasons are:
- Recent datasets are so large and complex that we're often more concerned about underfitting than overfitting.
- Dropout is a much better regularizer.
- Pooling results in a loss of information. Consider max pooling: within each window we keep only the largest of n values, discarding the other n − 1 entirely.
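To make the last two points concrete, here is a minimal NumPy sketch of 2×2 max pooling (stride 2). The function name and trimming behavior are illustrative choices, not from any particular library; the point is that a 4×4 input shrinks to 2×2, and each output value keeps only 1 of the 4 numbers in its window.

```python
import numpy as np

def max_pool_2x2(x):
    """2x2 max pooling with stride 2 on a 2D array (illustrative sketch)."""
    h, w = x.shape
    # Trim odd edges so the array tiles evenly into 2x2 windows
    x = x[:h - h % 2, :w - w % 2]
    # Reshape into (h/2, 2, w/2, 2) windows and take the max of each window
    return x.reshape(x.shape[0] // 2, 2, x.shape[1] // 2, 2).max(axis=(1, 3))

x = np.arange(16).reshape(4, 4)
pooled = max_pool_2x2(x)
print(pooled.shape)  # (2, 2): the output is a quarter the size of the input
print(pooled)        # [[ 5  7] [13 15]] — 3 of every 4 input values are discarded
```

Halving each spatial dimension quarters the activation volume, which is exactly why downstream fully connected layers end up with far fewer parameters.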